Learning Linear Functions with Quadratic and Linear Multiplicative Updates
Author
Abstract
We analyze variations of multiplicative updates for learning linear functions online. These can be described as replacing exponentiation in the Exponentiated Gradient (EG) algorithm with quadratic and linear functions. Both kinds of updates replace exponentiation with simpler operations and reduce dependence on the parameter that specifies the sum of the weights during learning. In particular, the linear multiplicative update places no restrictions on the sum of the weights and, under a wide range of conditions, achieves worst-case behavior close to that of the EG algorithm. We perform our analysis for square loss and absolute loss, and for regression and classification. We also describe experiments showing that the performance of our algorithms is comparable to that of EG and the p-norm algorithm.
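As a rough illustration of the kind of updates discussed, the Python sketch below contrasts a standard EG step for online regression under square loss with hypothetical quadratic and linear variants in which exp(z) is replaced by its second- and first-order truncations. The learning rate eta, the total weight U, and the exact form of the truncated factors are assumptions made for illustration, not the authors' precise update rules.

```python
import numpy as np

def eg_update(w, x, y, eta, U=1.0):
    """Exponentiated Gradient step for online linear regression with
    square loss: weights stay positive and are renormalized to sum to U."""
    z = -2.0 * eta * (w @ x - y) * x      # negative scaled gradient
    r = w * np.exp(z)
    return U * r / r.sum()

def quadratic_update(w, x, y, eta, U=1.0):
    """Hypothetical variant: exp(z) replaced by its quadratic truncation
    1 + z + z**2 / 2 (always positive), followed by renormalization."""
    z = -2.0 * eta * (w @ x - y) * x
    r = w * (1.0 + z + 0.5 * z ** 2)
    return U * r / r.sum()

def linear_update(w, x, y, eta):
    """Hypothetical variant: exp(z) replaced by 1 + z, with no
    renormalization, so the sum of the weights is unconstrained.
    A small enough eta is assumed so the factors stay positive."""
    z = -2.0 * eta * (w @ x - y) * x
    return w * (1.0 + z)
```

Replacing the exponential and, in the linear case, dropping the normalization step is what removes the explicit dependence on the total-weight parameter mentioned in the abstract.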
Similar Papers
Multiplicative Updates for L1-Regularized Linear and Logistic Regression
Multiplicative update rules have proven useful in many areas of machine learning. Simple to implement, guaranteed to converge, they account in part for the widespread popularity of algorithms such as nonnegative matrix factorization and Expectation-Maximization. In this paper, we show how to derive multiplicative updates for problems in L1-regularized linear and logistic regression. For L1–regu...
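For background only (this is a standard reformulation and not necessarily the derivation used in the paper above), L1-regularized least squares can be turned into a nonnegative quadratic program by splitting the weights into nonnegative parts, after which multiplicative updates of the kind discussed in these papers apply:

\[
w = u - v,\quad u, v \ge 0, \qquad
\min_{u,\,v \ge 0}\; \tfrac{1}{2}\,\lVert X(u - v) - y \rVert_2^2 \;+\; \lambda \sum_i (u_i + v_i).
\]

At an optimum $u_i v_i = 0$ for every $i$, so the penalty equals $\lambda \lVert w \rVert_1$, and the objective is a quadratic function of the nonnegative vector $z = (u, v)$.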
Multiplicative Updates for Large Margin Classifiers
Various problems in nonnegative quadratic programming arise in the training of large margin classifiers. We derive multiplicative updates for these problems that converge monotonically to the desired solutions for hard and soft margin classifiers. The updates differ strikingly in form from other multiplicative updates used in machine learning. In this paper, we provide complete proofs of conver...
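As one concrete example of the setting (stated here as background, with the bias term omitted for simplicity rather than following the paper's exact formulation), the hard-margin SVM dual is a nonnegative quadratic program:

\[
\min_{\alpha \ge 0}\; \tfrac{1}{2} \sum_{i,j} \alpha_i \alpha_j\, y_i y_j\, (x_i \cdot x_j) \;-\; \sum_i \alpha_i,
\]

i.e. an objective of the form $\tfrac{1}{2}\alpha^{\top} A \alpha + b^{\top}\alpha$ with $A_{ij} = y_i y_j\, x_i \cdot x_j$ and $b = -\mathbf{1}$, which is exactly the class of problems for which such multiplicative updates are derived.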
On multiplicative (strong) linear preservers of majorizations
In this paper, we study some kinds of majorizations on $\textbf{M}_{n}$ and their linear or strong linear preservers. Also, we find the structure of linear or strong linear preservers which are multiplicative, i.e. linear or strong linear preservers $\Phi$ with the property $\Phi(AB)=\Phi(A)\Phi(B)$ for every $A, B \in \textbf{M}_{n}$.
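As a concrete instance of the multiplicativity condition (an illustration added here, not an example taken from the paper), any similarity transformation is both linear and multiplicative:

\[
\Phi(X) = S^{-1} X S \;\Longrightarrow\; \Phi(AB) = S^{-1} A B S = (S^{-1} A S)(S^{-1} B S) = \Phi(A)\,\Phi(B),
\]

for any fixed invertible $S \in \textbf{M}_{n}$; whether such a map also preserves a given majorization is the kind of question the paper addresses.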
Multiplicative updates for non-negative projections
We present here how to construct multiplicative update rules for non-negative projections based on Oja’s iterative learning rule. Our method integrates the multiplicative normalization factor into the original additive update rule as an additional term which generally has a roughly opposite direction. As a consequence, the modified additive learning rule can easily be converted to its multiplic...
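The general recipe alluded to can be illustrated schematically (the specific objective and the Oja-rule details are not reproduced here): split the gradient into element-wise nonnegative parts and replace the additive step by a ratio, which keeps the weights nonnegative without an explicit step size or projection.

```python
import numpy as np

def multiplicative_from_additive(w, grad_pos, grad_neg):
    """Generic conversion of an additive update into a multiplicative one.
    If the gradient of the objective splits as grad_pos - grad_neg with
    both parts element-wise nonnegative, the descent step
        w <- w - eta * (grad_pos - grad_neg)
    can be replaced by the ratio rule below; the factor equals 1 exactly
    where the two parts balance, i.e. at stationary points."""
    eps = 1e-12  # guard against division by zero
    return w * grad_neg / (grad_pos + eps)
```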
Multiplicative Updates for Nonnegative Quadratic Programming
Many problems in neural computation and statistical learning involve optimizations with nonnegativity constraints. In this article, we study convex problems in quadratic programming where the optimization is confined to an axis-aligned region in the nonnegative orthant. For these problems, we derive multiplicative updates that improve the value of the objective function at each iteration and co...
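A minimal sketch of a multiplicative update of this kind, for minimizing $F(v) = \tfrac{1}{2} v^{\top} A v + b^{\top} v$ over $v \ge 0$, is given below. The square-root factor follows the update commonly associated with this line of work; the code is an illustration under that assumption, not a verbatim transcription of the article.

```python
import numpy as np

def nqp_multiplicative_step(v, A, b):
    """One multiplicative update for the nonnegative quadratic program
        minimize  0.5 * v^T A v + b^T v   subject to  v >= 0.
    A is split into its positive part A_pos and the magnitude of its
    negative part A_neg; each coordinate of v is rescaled by a factor
    that is >= 1 where the partial derivative (A v + b)_i is negative
    and <= 1 where it is positive."""
    A_pos = np.maximum(A, 0.0)
    A_neg = np.maximum(-A, 0.0)
    a = A_pos @ v
    c = A_neg @ v
    eps = 1e-12  # guard against division by zero
    factor = (-b + np.sqrt(b * b + 4.0 * a * c)) / (2.0 * a + eps)
    return v * factor
```

At a fixed point the factor equals one on every coordinate with $v_i > 0$, which is the stationarity condition $(Av + b)_i = 0$.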